AI Chatbots to Help with Mental Health Struggles

2024-04-02

From VOA Learning English, this is the Health & Lifestyle report.

The mental health chatbot Earkick greets users with a friendly-looking panda that could fit easily in a children's program.

When users talk about anxiety, the panda gives the kind of comforting statements that a trained mental health professional, called a therapist, would say. Then it might suggest breathing exercises or give advice on how to deal with stress.

Earkick is one of hundreds of free chatbots aimed at dealing with a mental health crisis among young people. But the co-founder of Earkick, Karin Andrea Stephan, says she and the other creators do not "feel comfortable" calling their chatbots a therapy tool.

Whether these chatbots, or apps, provide a simple self-help tool or mental health treatment is important to the growing digital health industry. Since the apps do not claim to diagnose or treat medical conditions, they do not need approval from the Food and Drug Administration (or FDA).

The use of AI chatbots

The industry's position is now coming under more careful examination with recent developments of chatbots powered by artificial intelligence (AI). The technology uses a large amount of data to copy human language.

The upsides are clear: the chatbots are free; they are available 24 hours a day; and people can use them in private.

Now for the downsides: there is limited data that the chatbots improve mental health, and they have not received FDA approval to treat conditions like depression.

Vaile Wright is a psychologist and technology director with the American Psychological Association. She said users of these chatbots "have no way to know whether they're actually effective."

Wright added that the chatbots are not the same as traditional mental health treatment. But, she said, they could help some people with less severe mental and emotional problems.

Earkick's website states that the app does not "provide any form of medical care, medical opinion, diagnosis or treatment." Some health lawyers say such claims are not enough.

Glenn Cohen of Harvard Law School said, "If you're really worried about people using your app for mental health services, you want a disclaimer that's more direct..." He suggested, "This is just for fun."

Still, chatbots are already playing a role due to an ongoing shortage of mental health professionals.

Shortage of mental health professionals

Britain's National Health Service has begun offering a chatbot called Wysa to help with stress, anxiety and depression among young people. This includes people waiting to see a therapist. Some health insurers, universities, and hospitals in the United States are offering similar programs.

Dr. Angela Skrzynski is a family doctor in the American state of New Jersey. When she tells her patients how long it will take to see a therapist, she says they are usually very open to trying a chatbot. Her employer, Virtua Health, offers Woebot to some adult patients.

Founded in 2017 by a Stanford-trained psychologist, Woebot does not use AI programs. The chatbot uses thousands of structured language models written by its staff and researchers. Woebot founder Alison Darcy says this rules-based model is safer for health care use.

The company is testing generative AI models, but Darcy says there have been problems with the technology. She said, "We couldn't stop the large language models from... telling someone how they should be thinking, instead of facilitating the person's process."

Woebot's finding was included in a research paper on AI chatbots published last year in Digital Medicine. The writers concluded that chatbots could help with depression in a short time. But there was no way to study their long-term effect on mental health.

Ross Koppel of the University of Pennsylvania studies health information technology. He worries these chatbots could be used in place of treatment and medications. Koppel and others would like to see the FDA review and possibly regulate these chatbots.

Dr. Doug Opel works at Seattle Children's Hospital. He said, "There's a whole host of questions we need to understand about this technology so we can ultimately do what we're all here to do: improve kids' mental and physical health."

And that's the Health & Lifestyle report. I'm Anna Matteo.

Matthew Perrone reported this story for the Associated Press from Washington, D.C. Anna Matteo adapted it for VOA Learning English.
__________________________________________________

Words in This Story

chatbot - n. a computer program or character (as in a game) that mimics the actions of a person and is designed to converse with human beings

anxiety - n. an abnormal and overwhelming sense of apprehension and fear, often marked by physical signs

diagnose - v. to recognize (something, such as a disease) by signs and symptoms

artificial intelligence - n. the capability of computer systems or algorithms to imitate intelligent human behavior

psychologist - n. a person who specializes in the study of mind and behavior or in the treatment of mental, emotional, and behavioral disorders

diagnosis - n. the art or act of identifying a disease from its signs and symptoms

facilitate - v. to help bring (something) about